Actually performing the calculations is trivial; I just have to get the data. This notebook should help me do just that. My sources are Moneycontrol and Yahoo! Finance. Now Yahoo! Finance is sad because it no longer readily gives out price data in CSV format. I do not want to scrape, but it looks like I have no other choice.
I am extracting the balance sheet from Moneycontrol and OHLC (open, high, low, close) data from Yahoo! Finance. Moneycontrol is pesky: it does not let me save data in CSV format, and because I am a lifelong Linux enthusiast, I would rather die than use MS Excel. So let me spin up a Selenium instance. Or maybe try BeautifulSoup.
Life update: I had to use Selenium and a bit of arm-twisting to scrape data from Yahoo! Finance because the morons use a JavaScript routine to fetch data instead of serving static pages. It is sad, I know.
#all imports in one place
from bs4 import BeautifulSoup
import requests
import csv
import numpy as np
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.wait import WebDriverWait
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager
#at this point, I am basically copy pasting from the instagram scraper I once
#wrote, so a lot of these imports might turn out to be useless.
import pandas as pd
from plotly.subplots import make_subplots
import plotly.graph_objects as go
import matplotlib.pyplot as plt
'''
READ ONLY: You cannot really make my life more difficult Rahul, but it would be
better if you do not execute this cell.
'''
file = open("profit_and_loss_BOB.csv", "w", newline='\n')
urls = ['/1', '/2', '/3']
data = dict()
write = csv.writer(file)
# headers = ['Interest / Discount on Advances / Bills', 'Income from Investments', 'Interest on Balance with RBI and Other Inter-Bank funds'] #trial, will extract these as well inshallah
for page in urls:
    r = requests.get('https://www.moneycontrol.com/financials/bankofbaroda/profit-lossVI/BOB{}#BOB'.format(page))
    soup = BeautifulSoup(r.text, 'html.parser')
    z = soup.find_all('tr') #bingo we have the data. Now we just have to sanitize it.
    for row in z:
        if row.text:
            tds = [x.text for x in row.find_all('td')][:-1]
            if not tds: #rows with no <td> cells would crash the indexing below
                continue
            header = tds[0]
            if header != '\xa0' and '' not in tds:
                if header in data:
                    data[header] += tds[1:]
                else:
                    data[header] = tds[1:]
for header in data:
    write.writerow([header] + data[header])
file.close()
Bingo! That was easy. Now I don't really have to save the dictionary and then go through the excruciating process of parsing JSON every time. Also, I guess I have enough data to do fundamental analysis now lol.
Life update: I ended up saving the data in a CSV file anyway because I wanted to look at the beautiful excel sheet I created.
Okay, so Yahoo! Finance does not allow scraping; that's just sad. But nothing has ever deterred the greatest of the great, Rahul Jha. They don't call me Stalker Supreme for nothing. Time to call in the cavalry.
options = webdriver.ChromeOptions()
s = Service('/home/juggernautjha/Rahul/vision/Selenium/chromedriver')
driver = webdriver.Chrome(service = s, options = options)
driver.get('https://finance.yahoo.com/quote/BANKBARODA.NS/history?period1=1594857600&period2=1657929600&interval=1d&filter=history&frequency=1d&includeAdjustedClose=true')
# last_height = driver.execute_script("return document.body.scrollHeight")
# while True:
# # Scroll down to the bottom.
# driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
# # Wait to load the page.
# time.sleep(2)
# print("scrolling\n")
# # Calculate new scroll height and compare with last scroll height.
# new_height = driver.execute_script("return document.body.scrollHeight")
# if new_height == last_height:
# break
# last_height = new_height
import time
time.sleep(15) #give the javascript routine time to populate the table
file = open("price-history_.csv", "w")
write = csv.writer(file)
write.writerow(['Date', 'Open', 'High', 'Low', 'Close', 'Adj. Close', 'Volume'])
z = driver.find_elements(By.CSS_SELECTOR, "tr")
for row in z[::-1]:
    f = row.text.split(" ")
    try:
        #data rows start with a date like "Jul 16, 2020"; glue the three tokens back together
        eval(f[2])
        f = [f[0] + " " + f[1] + " " + f[2]] + f[3:]
    except:
        pass
    print(f)
    print(len(f))
    write.writerow(f)
print("written")
file.close()
# print("Now exporting")
# with open("yahoo-finance-cached.html", 'w') as f:
# f.write(driver.page_source)
And we are done. Now that I have the page source, I do not need to torture my RAM by using Selenium for scraping; BeautifulSoup would do just fine. Also, for the love of god and everything that is holy, DO NOT RUN the cell above on anything with less than a few gigabytes of RAM.
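For the record, parsing the cached page source with BeautifulSoup would look something like this. The inline HTML snippet is a made-up stand-in for the saved page, not the actual Yahoo! Finance markup:

```python
from bs4 import BeautifulSoup

#a made-up miniature stand-in for the cached Yahoo! Finance table
cached = """
<table>
  <tr><td>Jul 16, 2020</td><td>48.00</td><td>48.35</td><td>46.90</td><td>48.10</td></tr>
  <tr><td>Jul 17, 2020</td><td>48.25</td><td>49.70</td><td>48.05</td><td>49.40</td></tr>
</table>
"""
soup = BeautifulSoup(cached, 'html.parser')
#one list of cell texts per table row, same shape as the Selenium loop produces
rows = [[td.text for td in tr.find_all('td')] for tr in soup.find_all('tr')]
```

No browser, no 15-second sleeps, and the RAM stays un-tortured.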
Now I have no idea what to do with EPS and PEG. I mean, sure, I can calculate them, but what next? I think the way to go is to scrape data for every stock in NIFTY200 (not everything, just the P/E ratio and the EPS) and then compare them. Hopefully my bank will come out on top. Calculating the technical indicators should be easy (I'll see lol), and I'll just graph them. I have no idea what to do after that.
This is where the actual magic (analysis) happens. None of this depends on the cells above. I mean, yeah, you'll need to run the cell with all the imports because that is good practice, but please, for the love of all things holy, do not run the data collection cells. I have all the data I need.
#defining stuff
DATE = 0
OPEN = 1
HIGH = 2
LOW = 3
CLOSE = 4
hist = pd.read_csv('price-history_.csv')
hist
# fig2 = make_subplots(specs=[[{"secondary_y": True}]])
# fig2.add_trace(go.Scatter(x=hist['Date'],y=hist['Close'],name='Price'),secondary_y=False)
# fig2.add_trace(go.Bar(x=hist['Date'],y=hist['Volume'],name='Volume'),secondary_y=True)
# fig2.show()
| | Date | Open | High | Low | Close | Adj. Close | Volume |
|---|---|---|---|---|---|---|---|
| 0 | Jul 16, 2020 | 48.00 | 48.35 | 46.90 | 48.10 | 46.69 | 22261317 |
| 1 | Jul 17, 2020 | 48.25 | 49.70 | 48.05 | 49.40 | 47.96 | 23121971 |
| 2 | Jul 20, 2020 | 49.45 | 50.30 | 48.40 | 48.65 | 47.23 | 33226925 |
| 3 | Jul 21, 2020 | 49.45 | 49.85 | 48.70 | 49.40 | 47.96 | 26865913 |
| 4 | Jul 22, 2020 | 49.50 | 49.95 | 48.35 | 48.70 | 47.28 | 27424578 |
| ... | ... | ... | ... | ... | ... | ... | ... |
| 488 | Jul 11, 2022 | 104.90 | 109.95 | 104.40 | 109.55 | 109.55 | 34213586 |
| 489 | Jul 12, 2022 | 108.80 | 110.50 | 108.10 | 109.25 | 109.25 | 29735635 |
| 490 | Jul 13, 2022 | 109.50 | 110.50 | 108.15 | 108.45 | 108.45 | 15508962 |
| 491 | Jul 14, 2022 | 108.00 | 108.00 | 103.70 | 105.20 | 105.20 | 33537636 |
| 492 | Jul 15, 2022 | 106.00 | 106.40 | 103.30 | 104.05 | 104.05 | 15017788 |
493 rows × 7 columns
Look at the beautiful graph. It is interactive. The only drawback is that it uses JavaScript, for reasons known to no one. Not me, anyway. Now, to calculate technical indicators, I can either write functions that take a dataframe and add a column, or I can go full consultant and use ready-made tools. Let me do both. This will allow me to check whether I am truly a top coder.
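To convince myself the two routes agree before wiring anything into the dataframe, a minimal sketch with made-up prices: a hand-rolled n-day SMA via numpy convolution versus pandas' one-liner.

```python
import numpy as np
import pandas as pd

#illustrative prices, not real data
prices = [48.10, 49.40, 48.65, 49.40, 48.70, 49.00, 50.10, 51.20]
n = 3

#hand-rolled: convolve with a uniform window; 'valid' drops the first n-1 days
manual = np.convolve(prices, np.ones(n) / n, mode='valid')

#consultant mode: pandas does the same in one line
rolled = pd.Series(prices).rolling(n).mean().dropna().to_numpy()

assert np.allclose(manual, rolled)
```

Both drop the first n-1 days, which is exactly the dropna behaviour in the function below.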
#Now, n-day averages aren't defined for the first n-1 days. Yup. Pandas for the win.
def SMA(n):
    close = hist.copy() #work on a copy so repeated calls don't keep shrinking the global
    close['SMA'] = close['Close'].rolling(n).mean()
    close.dropna(inplace = True)
    return close
c = SMA(10)
def graph(n):
    c = SMA(n)
    fig3 = make_subplots(specs=[[{"secondary_y": True}]])
    fig3.add_trace(go.Scatter(x=c['Date'], y=c['Close'], name='Price'), secondary_y=False)
    fig3.add_trace(go.Scatter(x=c['Date'], y=c['SMA'], name='Moving Average'), secondary_y=False)
    # fig3.add_trace(go.Bar(x=hist['Date'], y=hist['Volume'], name='Volume'), secondary_y=True)
    fig3.show()
The n-day average works splendidly. Great success.
McGinley Dynamic, on the other hand, is a uniquely strange problem: I am unable to figure out what M_{t-1} should be for t = 1. I mean, if I set M_1 = Close, let us see what we get.
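For reference, the recurrence as implemented below. The 0.6 smoothing constant is the one the code uses, and seeding M_1 with the first close is my own choice, not something the indicator's definition settles:

```latex
M_t = M_{t-1} + \frac{P_t - M_{t-1}}{0.6\, N \left(P_t / M_{t-1}\right)^4}, \qquad M_1 = P_1
```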
rows = []
def populate_rows(): #loops on dataframes -> bad idea *retches*
    with open('price-history_.csv', 'r') as f:
        reader = csv.reader(f)
        next(reader) #skip the header row
        for row in reader:
            z = [row[0]] + [float(x) for x in row[1:]] #float, not eval: safer and just as good here
            rows.append(z)
populate_rows()
def mc_ginley(rows, N):
    mc_indices = [rows[0][CLOSE]] #seed M_1 with the first close
    for day in rows[1:]:
        prev = mc_indices[-1]
        mc_index = prev + (day[CLOSE] - prev) / (0.6 * N * ((day[CLOSE] / prev) ** 4))
        mc_indices.append(mc_index)
    return mc_indices
def graph_mc_ginley(rows, N):
    hist = pd.read_csv("price-history_.csv")
    hist['mc'] = mc_ginley(rows, N)
    fig3 = make_subplots(specs=[[{"secondary_y": True}]])
    fig3.add_trace(go.Scatter(x=hist['Date'], y=hist['Close'], name='Price'), secondary_y=False)
    fig3.add_trace(go.Scatter(x=hist['Date'], y=hist['mc'], name='McGinley'), secondary_y=False)
    # fig3.add_trace(go.Bar(x=hist['Date'], y=hist['Volume'], name='Volume'), secondary_y=True)
    fig3.show()
#Comparison between moving average and McGinley
for I in range(10, 100, 10):
    print(I)
    graph(I)
    graph_mc_ginley(rows, I/2)
(Output: for each window size 10, 20, ..., 90, the loop prints the size and renders the SMA and McGinley charts; the interactive figures are not reproduced here.)
So, the McGinley metric (idk lol) lags a lot less than the simple moving average. Now, to implement PGO (or PGI, whatever floats your boat), I'll have to first implement EMA, which should be easy.
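For my own sanity, a minimal sketch of the EMA recurrence checked against pandas' ewm with adjust=False; the prices are made up for illustration:

```python
import pandas as pd

#illustrative prices, not real data
prices = [48.10, 49.40, 48.65, 49.40, 48.70]
n = 3
alpha = 2 / (n + 1) #the smoothing factor implied by span=n

#hand-rolled recurrence: EMA_t = alpha*P_t + (1 - alpha)*EMA_{t-1}, seeded with P_1
ema = [prices[0]]
for p in prices[1:]:
    ema.append(alpha * p + (1 - alpha) * ema[-1])

pandas_ema = pd.Series(prices).ewm(span=n, adjust=False).mean().tolist()
assert all(abs(a - b) < 1e-9 for a, b in zip(ema, pandas_ema))
```

adjust=False matters: with the default adjust=True, pandas uses a weighted-average formulation instead of this plain recurrence.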
def EMA(hist, n, col):
    c = hist
    c['ewm'] = c[col].ewm(span=n, min_periods=0, adjust=False, ignore_na=False).mean()
    return c
#works splendidly, cross checked from trading view.
def wwma(values, n):
    """
    J. Welles Wilder's EMA
    """
    return values.ewm(alpha=1/n, adjust=False).mean()

def ATR(df, n=14):
    data = df.copy()
    high = data['High']
    low = data['Low']
    close = data['Close']
    data['tr0'] = abs(high - low)
    data['tr1'] = abs(high - close.shift())
    data['tr2'] = abs(low - close.shift())
    tr = data[['tr0', 'tr1', 'tr2']].max(axis=1)
    atr = wwma(tr, n)
    df['atr'] = atr
    return df
def PGO(n):
    c = SMA(n)
    #this gives us a dataframe with the SMA column. I do NOT want to code ever again.
    c = ATR(c, n)
    #this gives us a dataframe with the ATR column
    c = EMA(c, n, 'atr')
    c.dropna(inplace=True)
    c['PGO'] = (c['Close'] - c['SMA']) / c['ewm']
    return c
#works splendidly. Checked.
Technical indicators implemented. I have little to no idea what to do with them. Sure, I can graph them, but what next? If only I knew how to trade....
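In lieu of actually knowing how to trade, here is a hedged sketch of one common way PGO gets read: values above +3 are often treated as overbought and below -3 as oversold. The thresholds and the toy pgo_values below are illustrative assumptions, not trading advice.

```python
def pgo_signal(pgo_value, threshold=3.0):
    #classify a single PGO reading against symmetric thresholds
    if pgo_value > threshold:
        return "overbought"
    if pgo_value < -threshold:
        return "oversold"
    return "neutral"

#made-up PGO readings for illustration
pgo_values = [-3.4, -1.2, 0.5, 2.9, 3.6]
signals = [pgo_signal(v) for v in pgo_values]
# signals == ['oversold', 'neutral', 'neutral', 'neutral', 'overbought']
```

In practice you would run this over PGO(n)['PGO'] and plot the crossings, but that is a project for a day when I know how to trade.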
Now, for the fundamental indicators: EPS is of no use by itself. So I'll modify the data-fetching function and extract just the profit and EPS rows for banks. Then compare. Then sleep.
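A worked toy example of the two ratios computed at the end of this notebook, with made-up numbers so the arithmetic is easy to check:

```python
#illustrative figures, not any real bank
price = 100.0
eps_latest, eps_prev = 10.0, 8.0

pe_ratio = price / eps_latest                          # 10.0
growth_pct = (eps_latest - eps_prev) / eps_prev * 100  # 25.0 (% EPS growth)
peg = pe_ratio / growth_pct                            # 0.4
```

A PEG under 1 is conventionally read as the price not yet reflecting the earnings growth, i.e. possibly undervalued, which is the lens the comparison below uses.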
BANKS_NIFTY_200 = {
"AUBANK" : "https://www.moneycontrol.com/financials/ausmallfinancebank/profit-lossVI/ASF02/{}#ASF02",
"AXISBANK" : "https://www.moneycontrol.com/financials/axisbank/profit-lossVI/AB16/{}#AB16",
"BANDHANBANK" : "https://www.moneycontrol.com/financials/bandhanbank/profit-lossVI/BB09/{}#BB09",
"BANKBARODA" : "https://www.moneycontrol.com/financials/bankofbaroda/profit-lossVI/BOB/{}#BOB",
"BANKINDIA" : "https://www.moneycontrol.com/financials/bankofindia/profit-lossVI/BOI/{}#BOI",
"CANBANK" : "https://www.moneycontrol.com/financials/canarabank/profit-lossVI/CB06/{}#CB06",
"HDFCBANK" : "https://www.moneycontrol.com/financials/hdfcbank/profit-lossVI/HDF01/{}#HDF01",
"ICICIBANK" : "https://www.moneycontrol.com/financials/icicibank/profit-lossVI/ICI02/{}#ICI02",
"INDIANB" : "https://www.moneycontrol.com/financials/indianbank/profit-lossVI/IB04/{}#IB04",
"INDUSINDBK" : "https://www.moneycontrol.com/financials/indusindbank/profit-lossVI/IIB/{}#IIB",
"KOTAKBANK" : "https://www.moneycontrol.com/financials/kotakmahindrabank/profit-lossVI/KMB/{}#KMB",
"PNB" : "https://www.moneycontrol.com/financials/punjabnationalbank/profit-lossVI/PNB05/{}#PNB05",
"SBIN" : "https://www.moneycontrol.com/financials/statebankofindia/profit-lossVI/SBI/{}#SBI",
"UNIONBANK" : "https://www.moneycontrol.com/financials/unionbankofindia/profit-lossVI/UBI01/{}#UBI01",
"YESBANK" : "https://www.moneycontrol.com/financials/yesbank/profit-lossVI/YB/{}#YB"
}
#god did this take a lot of time lololol
#Also I missed IDFC and IDBI banks because they suck. Yes.
Now, a simple scraper to collate all the data. We just need EPS and profits over the years. Nothing more, and certainly nothing less.
def scrape(url):
    urls = ['/1', '/2', '/3']
    data = dict()
    r = requests.get(url)
    soup_ = BeautifulSoup(r.text, 'html.parser')
    z = soup_.find_all("span", class_="span_price_wrap")
    try:
        data["price"] = float(z[0].text.replace(",", ""))
    except:
        data["price"] = "NOT"
    for page in urls:
        r = requests.get(url.format(page))
        soup = BeautifulSoup(r.text, 'html.parser')
        z = soup.find_all('tr') #bingo we have the data. Now we just have to sanitize it.
        for row in z:
            if row.text:
                tds = [x.text for x in row.find_all('td')][:-1]
                if not tds: #rows with no <td> cells would crash the indexing below
                    continue
                for l in range(len(tds)):
                    try:
                        tds[l] = float(tds[l].replace(",", ""))
                    except:
                        pass
                header = tds[0]
                if isinstance(header, str) and "Profit & Loss" in header:
                    if header in data:
                        data[header] += tds[1:]
                    else:
                        data[header] = tds[1:]
                if header in ["Net Profit / Loss for The Year", "Basic EPS (Rs.)", "Diluted EPS (Rs.)"]:
                    if header != '\xa0' and '' not in tds:
                        if header in data:
                            data[header] += tds[1:]
                        else:
                            data[header] = tds[1:]
    return data
all_banks = dict()
for i in BANKS_NIFTY_200:
    all_banks[i] = scrape(BANKS_NIFTY_200[i])
for i in all_banks:
    print(i)
    for j in all_banks[i]:
        print(j, all_banks[i][j])
    print()
AUBANK
price 550.9
Profit & Loss account of AU Small Finance Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13']
Net Profit / Loss for The Year [1129.83, 1170.68, 674.78, 381.81, 292.04, 821.98, -799.78, -547.57, -492.97, -341.3]
Basic EPS (Rs.) [36.06, 38.19, 22.78, 13.16, 10.26, 30.18, 9.28, 5.31, 2.99, 3.4]
Diluted EPS (Rs.) [35.69, 37.86, 22.32, 12.9, 10.0, 29.61, 9.28, 5.31, 2.99, 2.86]

AXISBANK
price 662.5
Profit & Loss account of Axis Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [13025.48, 6588.5, 1627.22, 4676.61, 275.68, 3679.28, 8223.66, 7357.82, 6217.67, 5179.43, 4242.21, 3388.49]
Basic EPS (Rs.) [42.48, 22.15, 5.99, 18.2, 1.13, 15.4, 34.59, 31.0, 132.56, 119.67, 102.94, 82.95]
Diluted EPS (Rs.) [42.35, 22.09, 5.97, 18.09, 1.12, 15.34, 34.4, 31.0, 132.23, 118.85, 102.2, 81.61]

BANDHANBANK
price 270.2
Profit & Loss account of Bandhan Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15']
Net Profit / Loss for The Year [125.79, 2205.46, 3023.74, 1951.5, 1345.56, 1111.95, 275.25, 0.58]
Basic EPS (Rs.) [0.78, 13.7, 18.78, 16.36, 12.26, 10.15, 3.4, 0.01]
Diluted EPS (Rs.) [0.78, 13.69, 18.76, 16.34, 12.26, 10.15, 3.4, 0.01]

BANKBARODA
price 104.05
Profit & Loss account of Bank Of Baroda (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [7272.28, 828.96, 546.19, 433.52, -2431.81, 1383.14, -5395.54, 3398.44, 4541.08, 4480.72, 5006.96, 4241.68]
Basic EPS (Rs.) [14.06, 1.78, 1.36, 1.64, -10.53, 6.0, -23.89, 15.83, 107.38, 108.84, 127.84, 116.37]
Diluted EPS (Rs.) [14.06, 1.78, 1.36, 1.41, -10.53, 6.0, -23.89, 15.83, 107.38, 108.84, 127.84, 116.37]

BANKINDIA
price 45.55
Profit & Loss account of Bank Of India (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [3404.7, 2160.3, -2956.89, -5546.9, -6043.71, -1558.31, -6089.21, 1708.92, 2729.27, 2749.35, 2677.52, 2488.71]
Basic EPS (Rs.) [8.84, 6.59, -9.1, -29.79, -52.55, -15.72, -83.01, 26.57, 44.74, 47.79, 48.98, 47.35]
Diluted EPS (Rs.) [8.84, 6.59, -9.1, -29.79, -52.55, -15.72, -83.01, 26.57, 44.74, 47.79, 48.98, 47.35]

CANBANK
price 206.05
Profit & Loss account of Canara Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [5678.41, 2557.58, -2235.72, 347.02, -4222.24, 1121.92, -2812.82, 2702.62, 2438.19, 2872.1, 3282.71, 4025.89]
Basic EPS (Rs.) [32.49, 16.91, -26.5, 4.71, -70.47, 20.63, -53.61, 58.59, 54.48, 64.83, 74.1, 97.83]
Diluted EPS (Rs.) [32.49, 16.91, -26.5, 4.71, -70.47, 20.63, -53.61, 58.59, 54.48, 64.83, 74.1, 97.83]

HDFCBANK
price 1363.85
Profit & Loss account of HDFC Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [36961.36, 31116.53, 26257.32, 21078.17, 17486.73, 14549.64, 12296.21, 10215.92, 8478.38, 6726.28, 5167.09, 3926.4]
Basic EPS (Rs.) [66.8, 56.58, 48.01, 78.65, 67.76, 57.18, 48.84, 42.0, 35.47, 28.49, 22.11, 17.0]
Diluted EPS (Rs.) [66.35, 56.32, 47.66, 77.87, 66.84, 56.43, 48.26, 42.0, 35.21, 28.18, 21.91, 16.81]

ICICIBANK
price 751.1
Profit & Loss account of ICICI Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [23339.49, 16192.68, 7930.81, 3363.3, 6777.42, 9801.09, 9726.29, 11175.35, 9810.48, 8325.47, 6465.26, 5151.38]
Basic EPS (Rs.) [33.66, 24.01, 12.28, 5.23, 10.56, 15.31, 16.75, 19.32, 84.99, 72.2, 56.11, 45.27]
Diluted EPS (Rs.) [32.98, 23.67, 12.08, 5.17, 10.46, 15.25, 16.65, 19.13, 84.65, 71.93, 55.95, 45.06]

INDIANB
price 171.6
Profit & Loss account of Indian Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [3944.82, 3004.68, 753.36, 321.95, 1258.99, 1405.68, 711.38, 1005.17, 1158.95, 1581.14, 1746.97, 1714.07]
Basic EPS (Rs.) [32.38, 26.61, 14.33, 6.7, 26.21, 29.27, 14.81, 21.62, 26.07, 35.8, 39.57, 38.79]
Diluted EPS (Rs.) [32.38, 26.61, 14.33, 6.7, 26.21, 29.27, 14.81, 21.62, 26.07, 35.8, 39.57, 38.79]

INDUSINDBK
price 815.15
Profit & Loss account of IndusInd Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [4611.12, 2836.39, 4417.91, 3301.1, 3605.99, 2867.89, 2286.45, 1793.72, 1408.02, 1061.18, 802.61, 577.33]
Basic EPS (Rs.) [59.57, 38.75, 63.75, 54.9, 60.19, 48.06, 39.68, 34.0, 27.0, 21.83, 17.2, 13.16]
Diluted EPS (Rs.) [59.47, 38.68, 63.52, 54.46, 59.57, 47.56, 39.26, 33.0, 26.0, 21.4, 16.86, 12.88]

KOTAKBANK
price 1787.6
Profit & Loss account of Kotak Mahindra Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [8572.69, 6964.84, 5947.18, 4865.33, 4084.3, 3411.5, 2089.78, 1865.98, 1502.52, 1360.72, 1085.05, 818.18]
Basic EPS (Rs.) [43.02, 35.17, 30.88, 25.52, 21.54, 18.57, 11.42, 24.2, 19.62, 18.31, 14.69, 11.35]
Diluted EPS (Rs.) [43.01, 35.14, 30.84, 25.48, 21.51, 18.55, 11.4, 24.14, 19.59, 18.24, 14.61, 11.28]

PNB
price 30.3
Profit & Loss account of Punjab National Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [3456.96, 2021.62, 336.19, -9975.49, -12282.82, 1324.8, -3974.4, 3061.58, 3342.58, 4747.67, 4884.2, 4433.5]
Basic EPS (Rs.) [3.16, 2.08, 0.62, -30.94, -55.39, 6.45, -20.82, 16.91, 93.91, 139.52, 154.02, 140.6]
Diluted EPS (Rs.) [3.16, 2.08, 0.62, -30.94, -55.39, 6.45, -20.82, 16.91, 93.91, 139.52, 154.02, 140.6]

SBIN
price 479.0
Profit & Loss account of State Bank of India (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [31675.98, 20410.47, 14488.11, 862.23, -6547.45, 10484.1, 9950.65, 13101.57, 10891.17, 14104.98, 11707.29, 8264.52]
Basic EPS (Rs.) [35.49, 22.87, 16.23, 0.97, -7.67, 13.43, 12.98, 17.55, 15.68, 210.06, 184.31, 130.16]
Diluted EPS (Rs.) [35.49, 22.87, 16.23, 0.97, -7.67, 13.43, 12.98, 17.55, 15.68, 210.06, 184.31, 130.16]

UNIONBANK
price 36.5
Profit & Loss account of Union Bank of India (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [5232.1, 2905.97, -2897.78, -2947.45, -5247.37, 555.21, 1351.6, 1781.64, 1696.2, 2157.93, 1787.14, 2081.95]
Basic EPS (Rs.) [7.73, 4.54, -12.49, -25.08, -69.45, 8.08, 20.42, 28.05, 27.99, 38.93, 34.07, 39.71]
Diluted EPS (Rs.) [7.73, 4.54, -12.49, -25.08, -69.45, 8.08, 20.42, 28.05, 27.99, 38.93, 34.07, 39.71]

YESBANK
price 13.26
Profit & Loss account of Yes Bank (in Rs. Cr.) ['Mar 22', 'Mar 21', 'Mar 20', 'Mar 19', 'Mar 18', 'Mar 17', 'Mar 16', 'Mar 15', 'Mar 14', 'Mar 13', 'Mar 12', 'Mar 11']
Net Profit / Loss for The Year [1066.21, -3462.23, -16418.03, 1720.28, 4224.56, 3330.1, 2539.45, 2005.36, 1617.78, 1300.68, 977.0, 727.14]
Basic EPS (Rs.) [0.43, -1.63, -56.07, 7.45, 18.43, 15.78, 60.62, 49.0, 45.0, 37.0, 28.0, 21.0]
Diluted EPS (Rs.) [0.43, -1.63, -56.06, 7.38, 18.06, 15.35, 59.31, 48.0, 44.0, 36.0, 27.0, 20.0]
PE_ratio = {i : all_banks[i]['price'] / all_banks[i]['Basic EPS (Rs.)'][0] for i in all_banks} #slick, innit?
EARNINGS_GROWTH = {i : (all_banks[i]['Basic EPS (Rs.)'][0] - all_banks[i]['Basic EPS (Rs.)'][1]) / all_banks[i]['Basic EPS (Rs.)'][1] for i in all_banks}
EARNINGS_GROWTH = {i : EARNINGS_GROWTH[i] * 100 for i in EARNINGS_GROWTH} #as a percentage
PEG_raw = {i : PE_ratio[i] / EARNINGS_GROWTH[i] for i in PE_ratio}
PEG = {}
for i in PEG_raw:
    if PEG_raw[i] >= 0:
        PEG[i] = PEG_raw[i] #because negative values are absurd
PE_R = {}
for i in PE_ratio:
    if PE_ratio[i] >= 0: #negative values absurd again. Loss making companies deserve love too.
        PE_R[i] = PE_ratio[i]
plt.plot([i for i in PEG], [PEG[i] for i in PEG])
plt.xticks(rotation=45, ha='right')
plt.title("PEG Ratio for Banking Scrips")
plt.show()
plt.plot([i for i in PE_R], [PE_R[i] for i in PE_R])
plt.xticks(rotation = 45, ha = 'right')
plt.title("P/E Ratio for Banking Scrips")
plt.show()
#BOB the most undervalued. Like Myself.